Inference Compilation and Universal Probabilistic Programming
Authors
Abstract
We introduce a method for using deep neural networks to amortize the cost of inference in models from the family induced by universal probabilistic programming languages, establishing a framework that combines the strengths of probabilistic programming and deep learning methods. We call what we do “compilation of inference” because our method transforms a denotational specification of an inference problem in the form of a probabilistic program written in a universal programming language into a trained neural network denoted in a neural network specification language. When at test time this neural network is fed observational data and executed, it performs approximate inference in the original model specified by the probabilistic program. Our training objective and learning procedure are designed to allow the trained neural network to be used as a proposal distribution in a sequential importance sampling inference engine. We illustrate our method on mixture models and Captcha solving and show significant speedups in the efficiency of inference.
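To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of amortized inference for a toy model with a single Gaussian latent: a small network is trained on samples drawn from the generative model to predict proposal parameters from the observation, and is then used as the proposal distribution in importance sampling. The sequential aspect of the paper's inference engine is collapsed to a single latent variable, and names such as ProposalNet and sample_generative_model are purely illustrative.

```python
# Toy model: x ~ N(0, 1), y ~ N(x, 1). We "compile" inference for it by
# training a network q(x | y) on simulated (x, y) pairs, then use q as an
# importance-sampling proposal at test time. Illustrative sketch only.
import torch
import torch.nn as nn

class ProposalNet(nn.Module):
    """Maps an observation y to the mean and std of a Gaussian proposal q(x | y)."""
    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, hidden), nn.ReLU(), nn.Linear(hidden, 2))

    def forward(self, y):
        out = self.net(y)
        mean, log_std = out[..., :1], out[..., 1:]
        return mean, log_std.exp()

def sample_generative_model(n):
    """Forward-simulate latent x and observation y from the model prior."""
    x = torch.randn(n, 1)        # x ~ N(0, 1)
    y = x + torch.randn(n, 1)    # y ~ N(x, 1)
    return x, y

# "Compilation" phase: minimise E_{p(x,y)}[-log q(x | y)] over simulated data,
# i.e. train the proposal on samples from the joint distribution.
proposal = ProposalNet()
opt = torch.optim.Adam(proposal.parameters(), lr=1e-2)
for step in range(2000):
    x, y = sample_generative_model(256)
    mean, std = proposal(y)
    loss = -torch.distributions.Normal(mean, std).log_prob(x).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Test phase: the trained network provides the proposal for importance sampling.
y_obs = torch.tensor([[1.5]])
mean, std = proposal(y_obs)
q = torch.distributions.Normal(mean, std)
xs = q.sample((5000,))                                    # draws from q(x | y_obs)
log_prior = torch.distributions.Normal(0.0, 1.0).log_prob(xs)
log_lik = torch.distributions.Normal(xs, 1.0).log_prob(y_obs)
log_w = (log_prior + log_lik - q.log_prob(xs)).squeeze()  # importance weights
w = torch.softmax(log_w, dim=0)
posterior_mean = (w * xs.squeeze()).sum()
print(float(posterior_mean))
```

For this conjugate toy model the analytic posterior mean is y_obs / 2 = 0.75, so the printed estimate should land close to that value once the proposal is trained.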
Similar papers
Improvements to Inference Compilation for Probabilistic Programming in Large-Scale Scientific Simulators
We consider the problem of Bayesian inference in the family of probabilistic models implicitly defined by stochastic generative models of data. In scientific fields ranging from population biology to cosmology, low-level mechanistic components are composed to create complex generative models. These models lead to intractable likelihoods and are typically non-differentiable, which poses challeng...
Compiling Probabilistic Logic Programs into Sentential Decision Diagrams
Knowledge compilation algorithms transform a probabilistic logic program into a circuit representation that permits efficient probability computation. Knowledge compilation underlies algorithms for exact probabilistic inference and parameter learning in several languages, including ProbLog, PRISM, and LPADs. Developing such algorithms involves a choice, of which circuit language to target, and ...
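The principle behind such circuit representations can be illustrated with a small hand-built example (the circuit below is written by hand for two independent probabilistic facts; a real system such as ProbLog targeting SDDs produces it automatically): once OR-nodes are deterministic (their children are mutually exclusive) and AND-nodes are decomposable, the query probability follows from a single bottom-up pass that replaces OR with addition and AND with multiplication.

```python
# Probabilistic facts: 0.3::a.  0.6::b.   Clauses: q :- a.  q :- b.
# Formula for the query q: a OR b, rewritten deterministically as a OR (NOT a AND b),
# so weighted model counting reduces to evaluating the circuit bottom-up.
import math

weights = {"a": 0.3, "b": 0.6}

circuit = ("or",
           ("lit", "a", True),
           ("and",
            ("lit", "a", False),
            ("lit", "b", True)))

def wmc(node):
    """Weighted model counting by structural recursion over the circuit."""
    kind = node[0]
    if kind == "lit":
        _, var, positive = node
        return weights[var] if positive else 1.0 - weights[var]
    vals = [wmc(child) for child in node[1:]]
    return sum(vals) if kind == "or" else math.prod(vals)

print(wmc(circuit))  # 0.72  (= 1 - 0.7 * 0.4)
```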
Anytime Inference in Probabilistic Logic Programs with Tp-Compilation
Existing techniques for inference in probabilistic logic programs are sequential: they first compute the relevant propositional formula for the query of interest, then compile it into a tractable target representation and finally, perform weighted model counting on the resulting representation. We propose TP-compilation, a new inference technique based on forward reasoning. TP-compilation pro...
TP-Compilation for inference in probabilistic logic programs
We propose TP-compilation, a new inference technique for probabilistic logic programs that is based on forward reasoning. TP-compilation proceeds incrementally in that it interleaves the knowledge compilation step for weighted model counting with forward reasoning on the logic program. This leads to a novel anytime algorithm that provides hard bounds on the inferred probabilities. The main diff...
A Compilation Target for Probabilistic Programming Languages
Forward inference techniques such as sequential Monte Carlo and particle Markov chain Monte Carlo for probabilistic programming can be implemented in any programming language by creative use of standardized operating system functionality including processes, forking, mutexes, and shared memory. Exploiting this we have defined, developed, and tested a probabilistic programming language intermedi...
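As a rough, Unix-only sketch of the underlying trick (assumed details, not the paper's actual intermediate representation): fork() captures the entire execution state of a running simulation at a checkpoint, so several independent continuations ("particles") can be explored from the same point and report their results back over a pipe.

```python
# Illustrative only: uses os.fork, so it runs on Unix-like systems.
import os
import random
import struct

def program(seed):
    """A toy generative program whose state at the checkpoint is captured by fork()."""
    random.seed(seed)
    x = random.gauss(0.0, 1.0)          # work done before the checkpoint is shared
    # --- checkpoint: fork K continuations that all inherit x and the call stack
    read_fd, write_fd = os.pipe()
    K = 4
    for k in range(K):
        pid = os.fork()
        if pid == 0:                     # child: one particle / continuation
            random.seed(os.getpid())     # let randomness diverge after the fork
            y = x + random.gauss(0.0, 0.1)
            os.write(write_fd, struct.pack("d", y))
            os._exit(0)
    os.close(write_fd)                   # parent keeps only the read end
    results = []
    with os.fdopen(read_fd, "rb") as r:
        for _ in range(K):
            results.append(struct.unpack("d", r.read(8))[0])
    for _ in range(K):
        os.wait()                        # reap the child processes
    return x, results

if __name__ == "__main__":
    print(program(seed=0))
```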